Smoothing Projected Gradient Method and Its Application to Stochastic Linear Complementarity Problems



Similar references

Smoothing Projected Gradient Method and Its Application to Stochastic Linear Complementarity Problems

A smoothing projected gradient (SPG) method is proposed for the minimization problem on a closed convex set, where the objective function is locally Lipschitz continuous but nonconvex and nondifferentiable. We show that any accumulation point generated by the SPG method is a stationary point associated with the smoothing function used in the method, which is a Clarke stationary point in many appli...
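The idea described above — smooth the nonsmooth objective, take projected gradient steps, and drive the smoothing parameter to zero — can be sketched in a few lines. This is a minimal illustration, not the authors' algorithm: the L1 objective, the box constraint, the sqrt-smoothing of |t|, and the step length tied to the smoothing parameter mu are all simplifying assumptions.

```python
import numpy as np

def spg_box(c, lo, hi, mu0=1.0, shrink=0.97, iters=200):
    """Projected gradient on a smoothed version of the nonsmooth objective
    f(x) = ||x - c||_1 over the box [lo, hi]^n, driving the smoothing
    parameter mu to zero.  |t| is smoothed as sqrt(t^2 + mu^2)."""
    x = np.zeros_like(c)
    mu = mu0
    for _ in range(iters):
        d = x - c
        g = d / np.sqrt(d * d + mu * mu)   # gradient of the smoothed objective
        x = np.clip(x - mu * g, lo, hi)    # step length ~ mu keeps the update stable
        mu *= shrink                       # shrink the smoothing parameter
    return x

# The minimizer of |x1 - 2| + |x2 + 0.3| over [-1, 1]^2 is (1, -0.3).
x = spg_box(np.array([2.0, -0.3]), lo=-1.0, hi=1.0)
```

The projection is just a componentwise clip here because the feasible set is a box; any closed convex set with a cheap projection would fit the same template.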


A Projected Algebraic Multigrid Method for Linear Complementarity Problems

We present an algebraic version of an iterative multigrid method for obstacle problems, called projected algebraic multigrid (PAMG) here. We show that classical algebraic multigrid algorithms can easily be extended to deal with this kind of problem. This paves the way for efficient multigrid solution of obstacle problems with partial differential equations arising, for example, in financial eng...
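As a rough illustration of the kind of building block such multigrid methods rely on, here is a projected Gauss–Seidel sweep for a small LCP (find x >= 0 with Mx + q >= 0 and x^T(Mx + q) = 0); sweeps of this type are commonly used as smoothers between grid transfers. The 2x2 matrix and vector are made-up toy data, and a real PAMG cycle would add coarse-level corrections.

```python
import numpy as np

def projected_gauss_seidel(M, q, iters=50):
    """Gauss-Seidel sweeps for the LCP: find x >= 0 with M x + q >= 0 and
    x^T (M x + q) = 0, projecting each updated component back onto x_i >= 0."""
    x = np.zeros(len(q))
    for _ in range(iters):
        for i in range(len(q)):
            r = M[i] @ x + q[i]                  # residual of row i at current x
            x[i] = max(0.0, x[i] - r / M[i, i])  # Gauss-Seidel step, then project
    return x

# Toy 2x2 problem; its solution is x = (0.75, 0) with M x + q = (0, 0.25).
M = np.array([[4.0, -1.0], [-1.0, 4.0]])
q = np.array([-3.0, 1.0])
x = projected_gauss_seidel(M, q)
```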


Expected Residual Minimization Method for Stochastic Linear Complementarity Problems

This paper presents a new formulation for the stochastic linear complementarity problem (SLCP), which aims at minimizing an expected residual defined by an NCP function. We generate observations by quasi-Monte Carlo methods and prove that every accumulation point of minimizers of the discrete approximation problems is a minimum expected residual solution of the SLCP. We show that a sufficient c...
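A sample-average version of such an expected-residual objective is straightforward to write down. The sketch below uses the Fischer-Burmeister NCP function, plain equally weighted samples, and a crude grid search — not the paper's quasi-Monte Carlo and convergence machinery — and the 1-D problem data are invented for illustration.

```python
import numpy as np

def fb(a, b):
    # Fischer-Burmeister NCP function: fb(a, b) = 0 iff a >= 0, b >= 0, a*b = 0
    return np.sqrt(a * a + b * b) - a - b

def expected_residual(x, M_samples, q_samples):
    """Sample-average of ||Phi(x, omega)||^2 with Phi_i = fb(x_i, (M x + q)_i)."""
    total = 0.0
    for M, q in zip(M_samples, q_samples):
        total += np.sum(fb(x, M @ x + q) ** 2)
    return total / len(q_samples)

# Toy 1-D SLCP: M(omega) = I, q(omega) = -0.5 or +0.5 with equal probability.
Ms = [np.eye(1), np.eye(1)]
qs = [np.array([-0.5]), np.array([0.5])]
grid = np.linspace(-1.0, 1.0, 201)
best = min(grid, key=lambda t: expected_residual(np.array([t]), Ms, qs))
```

No single x solves both scenarios here (their LCP solutions are 0.5 and 0), so the minimum expected residual is positive and the minimizer sits strictly between them.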


Spectral projected gradient method for stochastic optimization

We consider the Spectral Projected Gradient method for solving constrained optimization problems with the objective function in the form of a mathematical expectation. It is assumed that the feasible set is convex, closed, and easy to project onto. The objective function is approximated by a sequence of Sample Average Approximation functions with different sample sizes. The sample size update is bas...
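The spectral (Barzilai-Borwein) step length combined with projection can be sketched as follows on a sample-average approximation. The quadratic objective E[||x - xi||^2], the box feasible set, and the single fixed sample are simplifying assumptions; the paper's adaptive sample-size scheme is not reproduced here.

```python
import numpy as np

def spectral_projected_gradient(x0, xi, lo, hi, iters=50):
    """Projected gradient with a Barzilai-Borwein (spectral) step length on a
    fixed sample-average approximation of E[||x - xi||^2] over the box [lo, hi]."""
    x = np.clip(np.asarray(x0, dtype=float), lo, hi)
    grad = lambda z: 2.0 * (z - xi.mean(axis=0))  # SAA gradient
    g, step = grad(x), 1.0
    for _ in range(iters):
        x_new = np.clip(x - step * g, lo, hi)     # projected step
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-12:
            step = (s @ s) / (s @ y)              # BB1 spectral step length
        x, g = x_new, g_new
    return x

rng = np.random.default_rng(0)
xi = rng.normal(loc=[2.0, -0.2], scale=0.1, size=(500, 2))  # fixed sample
x = spectral_projected_gradient(np.zeros(2), xi, lo=-1.0, hi=1.0)
```

The SAA minimizer is the projection of the sample mean onto the box, so the first coordinate lands on the boundary at 1 and the second at the sample mean.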


Smoothing methods for convex inequalities and linear complementarity problems

A smooth approximation p(x, α) to the plus function max{x, 0} is obtained by integrating the sigmoid function 1/(1 + e^(−αx)), commonly used in neural networks. By means of this approximation, linear and convex inequalities are converted into smooth, convex unconstrained minimization problems, the solution of which approximates the solution of the original problem to a high degree of accuracy for ...
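Integrating the sigmoid gives the closed form p(x, α) = x + (1/α) log(1 + e^(−αx)), whose derivative is exactly the sigmoid and which approximates max(x, 0) within log(2)/α everywhere. A small numerical check (the function name is ours):

```python
import math
import numpy as np

def smooth_plus(x, alpha):
    """Smooth approximation to max(x, 0): the antiderivative of the sigmoid
    1/(1 + exp(-alpha*x)).  The error is at most log(2)/alpha, attained at x = 0."""
    # np.logaddexp(0, t) computes log(1 + exp(t)) without overflow
    return x + np.logaddexp(0.0, -alpha * x) / alpha
```

As alpha grows, the approximation sharpens: the gap at the kink x = 0 is exactly log(2)/alpha, and it decays rapidly away from zero.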



Journal

Journal title: SIAM Journal on Optimization

Year: 2009

ISSN: 1052-6234,1095-7189

DOI: 10.1137/070702187